Memory gradient method with Goldstein line search

Authors
Abstract


Similar Articles

A memory gradient method without line search for unconstrained optimization

Memory gradient methods are used for unconstrained optimization, especially for large-scale problems. The idea of memory gradient methods was first proposed by Miele and Cantrell (1969) and subsequently extended by Cragg and Levy (1969). Recently, Narushima and Yabe (2006) proposed a new memory gradient method which generates a descent search direction for the objective function at every iteration a...
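The combination named in the title of the main paper, a memory gradient direction paired with a Goldstein line search, can be sketched as follows. This is an illustrative reconstruction, not the paper's algorithm: the fixed memory weight eta, the descent safeguard, and the bisection bracketing are all assumptions.

```python
import numpy as np

def goldstein_step(f, x, g, d, c=0.25, alpha=1.0, max_iter=50):
    """Find a step length satisfying the Goldstein conditions
    f(x) + (1-c)*a*(g@d) <= f(x + a*d) <= f(x) + c*a*(g@d), 0 < c < 1/2,
    by bisection on a bracketing interval."""
    lo, hi = 0.0, np.inf
    fx, slope = f(x), g @ d                      # slope < 0 for a descent direction
    for _ in range(max_iter):
        fa = f(x + alpha * d)
        if fa > fx + c * alpha * slope:          # not enough decrease: shrink step
            hi = alpha
        elif fa < fx + (1 - c) * alpha * slope:  # step too short: grow it
            lo = alpha
        else:
            return alpha
        alpha = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * alpha
    return alpha

def memory_gradient(f, grad, x0, eta=0.1, iters=500, tol=1e-6):
    """Memory gradient iteration d_k = -g_k + eta*d_{k-1} with Goldstein steps.
    The fixed weight eta is an illustrative choice, not the paper's rule."""
    x = np.asarray(x0, dtype=float)
    d_prev = np.zeros_like(x)
    for _ in range(iters):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g + eta * d_prev
        if g @ d >= 0:                           # safeguard: keep d a descent direction
            d = -g
        alpha = goldstein_step(f, x, g, d)
        x, d_prev = x + alpha * d, d
    return x
```

On a convex quadratic the iteration converges to the minimizer; the Goldstein conditions bound the step from both sides, ruling out steps that are too long as well as steps that are too short.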


The Gradient Projection Method with Exact Line Search

The gradient projection algorithm for function minimization is often implemented using an approximate local minimization along the projected negative gradient. On the other hand, for some difficult combinatorial optimization problems, where a starting guess may be far from a solution, it may be advantageous to perform a nonlocal (exact) line search. In this paper we show how to evaluate the pie...


On the Goldstein-Levitin-Polyak Gradient Projection Method

This paper considers some aspects of a gradient projection method proposed by Goldstein [1], Levitin and Polyak [3], and more recently, in a less general context, by McCormick [10]. We propose and analyze some convergent step-size rules to be used in conjunction with the method. These rules are similar in spirit to the efficient Armijo rule for the method of steepest descent and under mild assu...
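A minimal sketch of a gradient projection iteration with an Armijo-type step-size rule along the projection arc, for box constraints. The box projection and all parameter values here are illustrative assumptions, not the paper's exact method:

```python
import numpy as np

def gradient_projection(f, grad, x0, lo, hi, s=1.0, shrink=0.5, sigma=1e-4, iters=200):
    """Gradient projection for box constraints lo <= x <= hi, with an
    Armijo-type rule along the projection arc:
        accept alpha when f(x(alpha)) <= f(x) - sigma * g @ (x - x(alpha)),
    where x(alpha) = clip(x - alpha*g, lo, hi)."""
    x = np.clip(np.asarray(x0, dtype=float), lo, hi)
    for _ in range(iters):
        g = grad(x)
        alpha = s
        while True:
            x_new = np.clip(x - alpha * g, lo, hi)
            if f(x_new) <= f(x) - sigma * (g @ (x - x_new)) or alpha < 1e-12:
                break
            alpha *= shrink                     # backtrack
        if np.linalg.norm(x_new - x) < 1e-10:   # projected step is (numerically) zero
            break
        x = x_new
    return x
```

In the unconstrained case the quantity g @ (x - x(alpha)) reduces to alpha*‖g‖², so the acceptance test recovers the ordinary Armijo rule.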


Some global convergence properties of the Wei-Yao-Liu conjugate gradient method with inexact line search

In [Z. Wei, S. Yao, L. Liu, The convergence properties of some new conjugate gradient methods, Applied Mathematics and Computation 183 (2006) 1341–1350], Wei et al. proposed a new conjugate gradient method, called the WYL method, which performs well in numerical experiments and has some excellent properties such as β_k ≥ 0. In this paper, we prove that while t_k ≤ ((1−c)/(2L))·(‖g_k‖/‖d_k‖), the sufficient descent conditio...
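The nonnegativity property β_k ≥ 0 mentioned above follows from the Cauchy-Schwarz inequality and can be checked numerically. A sketch assuming the standard WYL formula β_k = g_{k+1}^T (g_{k+1} − (‖g_{k+1}‖/‖g_k‖) g_k) / ‖g_k‖²:

```python
import numpy as np

def beta_wyl(g_new, g_old):
    """WYL conjugate gradient parameter (Wei-Yao-Liu)."""
    r = np.linalg.norm(g_new) / np.linalg.norm(g_old)
    return g_new @ (g_new - r * g_old) / (g_old @ g_old)

# Nonnegativity: g_new @ g_old <= ||g_new||*||g_old|| (Cauchy-Schwarz),
# so the numerator ||g_new||^2 - r*(g_new @ g_old) is never negative.
rng = np.random.default_rng(0)
for _ in range(1000):
    g_new, g_old = rng.standard_normal((2, 4))
    assert beta_wyl(g_new, g_old) >= -1e-12
```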


A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search

A new nonlinear conjugate gradient method and an associated implementation, based on an inexact line search, are proposed and analyzed. With exact line search, our method reduces to a nonlinear version of the Hestenes–Stiefel conjugate gradient scheme. For any (inexact) line search, our scheme satisfies the descent condition g_k^T d_k ≤ −(7/8)‖g_k‖². Moreover, a global convergence result is establis...
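The guaranteed-descent bound quoted above holds for arbitrary iterates (whenever d_k^T y_k ≠ 0), so it can be verified on random vectors. A sketch assuming the standard Hager-Zhang update; the paper itself is the authority on the exact statement:

```python
import numpy as np

def hz_direction(g_new, g_old, d):
    """Hager-Zhang search direction d+ = -g+ + beta*d with
    beta = (y - 2*d*(y@y)/(d@y)) @ g+ / (d@y),  where y = g+ - g."""
    y = g_new - g_old
    dy = d @ y
    beta = ((y - 2.0 * d * (y @ y) / dy) @ g_new) / dy
    return -g_new + beta * d

# Descent bound g+ @ d+ <= -(7/8)*||g+||^2, independent of the line search.
rng = np.random.default_rng(1)
for _ in range(1000):
    g_new, g_old, d = rng.standard_normal((3, 5))
    if abs(d @ (g_new - g_old)) < 0.1:   # skip near-degenerate cases with d@y ~ 0
        continue
    d_new = hz_direction(g_new, g_old, d)
    assert g_new @ d_new <= -0.875 * (g_new @ g_new) + 1e-9
```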



Journal

Journal title: Computers & Mathematics with Applications

Year: 2007

ISSN: 0898-1221

DOI: 10.1016/j.camwa.2007.02.001